
Applying The Blueprint for an AI Bill of Rights
DEFINITIONS
ALGORITHMIC DISCRIMINATION: “Algorithmic discrimination” occurs when automated systems
contribute to unjustified different treatment or impacts disfavoring people based on their race, color, ethnicity,
sex (including pregnancy, childbirth, and related medical conditions, gender identity, intersex status, and sexual
orientation), religion, age, national origin, disability, veteran status, genetic information, or any other classification protected by law. Depending on the specific circumstances, such algorithmic discrimination may violate
legal protections. Throughout this framework the term “algorithmic discrimination” takes this meaning (and
not a technical understanding of discrimination as distinguishing between items).
AUTOMATED SYSTEM: An “automated system” is any system, software, or process that uses computation, in whole or in part, to determine outcomes, make or aid decisions, inform policy implementation, collect
data or observations, or otherwise interact with individuals and/or communities. Automated systems
include, but are not limited to, systems derived from machine learning, statistics, or other data processing
or artificial intelligence techniques, and exclude passive computing infrastructure. “Passive computing
infrastructure” is any intermediary technology that does not influence or determine the outcome of a decision,
make or aid in decisions, inform policy implementation, or collect data or observations, including web
hosting, domain registration, networking, caching, data storage, or cybersecurity. Throughout this
framework, automated systems that are considered in scope are only those that have the potential to
meaningfully impact individuals’ or communities’ rights, opportunities, or access.
COMMUNITIES: “Communities” include: neighborhoods; social network connections (both online and
offline); families (construed broadly); people connected by affinity, identity, or shared traits; and formal organizational ties. This includes Tribes, Clans, Bands, Rancherias, Villages, and other Indigenous communities. AI
and other data-driven automated systems most directly collect data on, make inferences about, and may cause
harm to individuals. But the overall magnitude of their impacts may be most readily visible at the level of communities. Accordingly, the concept of community is integral to the scope of the Blueprint for an AI Bill of Rights.
United States law and policy have long employed approaches for protecting the rights of individuals, but existing frameworks have sometimes struggled to provide protections when effects manifest most clearly at a community level. For these reasons, the Blueprint for an AI Bill of Rights asserts that the harms of automated
systems should be evaluated, protected against, and redressed at both the individual and community levels.
EQUITY: “Equity” means the consistent and systematic fair, just, and impartial treatment of all individuals.
Systemic, fair, and just treatment must take into account the status of individuals who belong to underserved
communities that have been denied such treatment, such as Black, Latino, and Indigenous and Native American
persons, Asian Americans and Pacific Islanders and other persons of color; members of religious minorities;
women, girls, and non-binary people; lesbian, gay, bisexual, transgender, queer, and intersex (LGBTQI+)
persons; older adults; persons with disabilities; persons who live in rural areas; and persons otherwise adversely
affected by persistent poverty or inequality.
RIGHTS, OPPORTUNITIES, OR ACCESS: “Rights, opportunities, or access” is used to indicate the scoping
of this framework. It describes the set of: civil rights, civil liberties, and privacy, including freedom of speech,
voting, and protections from discrimination, excessive punishment, unlawful surveillance, and violations of
privacy and other freedoms in both public and private sector contexts; equal opportunities, including equitable
access to education, housing, credit, employment, and other programs; or, access to critical resources or
services, such as healthcare, financial services, safety, social services, non-deceptive information about goods
and services, and government benefits.